Batch-normalized Maxout Network in Network

Authors

  • Jia-Ren Chang
  • Yong-Sheng Chen
Abstract

This paper reports a novel deep architecture referred to as Maxout network In Network (MIN), which can enhance model discriminability and facilitate the process of information abstraction within the receptive field. The proposed network adopts the framework of the recently developed Network In Network structure, which slides a universal approximator, a multilayer perceptron (MLP) with rectifier units, to extract features. Instead of MLP, we employ a maxout MLP to learn a variety of piecewise linear activation functions and to mediate the problem of vanishing gradients that can occur when using rectifier units. Moreover, batch normalization is applied to reduce the saturation of maxout units by pre-conditioning the model, and dropout is applied to prevent overfitting. Finally, average pooling is used in all pooling layers to regularize the maxout MLP in order to facilitate information abstraction in every receptive field while tolerating changes of object position. Because average pooling preserves all features in the local patch, the proposed MIN model can enforce the suppression of irrelevant information during training. Our experiments demonstrate state-of-the-art classification performance when the MIN model is applied to the MNIST, CIFAR-10, and CIFAR-100 datasets, and comparable performance on the SVHN dataset.
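
As a concrete illustration, here is a minimal sketch of one MIN-style block in PyTorch. The layer widths, pooling size, and dropout rate are illustrative assumptions rather than the paper's exact configuration, and MaxoutConv2d/MINBlock are names introduced here, not from the paper.

    import torch
    import torch.nn as nn

    class MaxoutConv2d(nn.Module):
        """Convolution whose activation is a maxout over k affine pieces:
        the layer emits k * out_channels feature maps and keeps the
        channelwise max within each group of k, giving a learned
        piecewise-linear activation function."""
        def __init__(self, in_channels, out_channels, kernel_size, k=2, **kw):
            super().__init__()
            self.k, self.out_channels = k, out_channels
            self.conv = nn.Conv2d(in_channels, k * out_channels, kernel_size, **kw)
            # Batch normalization pre-conditions the maxout inputs,
            # reducing saturation of the units.
            self.bn = nn.BatchNorm2d(k * out_channels)

        def forward(self, x):
            y = self.bn(self.conv(x))
            n, _, h, w = y.shape
            # Take the max over the k pieces within each group of maps.
            return y.view(n, self.out_channels, self.k, h, w).max(dim=2).values

    class MINBlock(nn.Module):
        """One MIN block: a spatial maxout convolution, a two-layer maxout
        MLP (1x1 convolutions) slid over the receptive field, then average
        pooling and dropout, mirroring the components named in the abstract."""
        def __init__(self, in_channels, out_channels, kernel_size, k=2, p_drop=0.5):
            super().__init__()
            self.block = nn.Sequential(
                MaxoutConv2d(in_channels, out_channels, kernel_size,
                             k=k, padding=kernel_size // 2),
                MaxoutConv2d(out_channels, out_channels, 1, k=k),  # 1x1 maxout MLP
                MaxoutConv2d(out_channels, out_channels, 1, k=k),  # 1x1 maxout MLP
                nn.AvgPool2d(3, stride=2, padding=1),  # average pooling throughout
                nn.Dropout(p_drop),                    # dropout against overfitting
            )

        def forward(self, x):
            return self.block(x)

    x = torch.randn(8, 3, 32, 32)       # e.g. a CIFAR-10-sized minibatch
    print(MINBlock(3, 96, 5)(x).shape)  # torch.Size([8, 96, 16, 16])

Average pooling is used in place of max pooling because it preserves every feature in the local patch; training can then learn to suppress irrelevant activations rather than discarding them outright.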

Related Articles

Convolutional deep maxout networks for phone recognition

Convolutional neural networks have recently been shown to outperform fully connected deep neural networks on several speech recognition tasks. Their superior performance is due to their convolutional structure that processes several, slightly shifted versions of the input window using the same weights, and then pools the resulting neural activations. This pooling operation makes the network les...
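
The shared-weights-plus-pooling idea in this snippet can be sketched in a few lines of NumPy; the signal, filter, and pooling width below are made-up illustrations, not drawn from the paper itself.

    import numpy as np

    def conv1d_maxpool(signal, w, pool=2):
        # Slide the same weight vector w over every window of the signal
        # (weight sharing across slightly shifted input windows).
        responses = np.array([signal[i:i + len(w)] @ w
                              for i in range(len(signal) - len(w) + 1)])
        # Max-pool neighbouring responses: a small shift of the input may
        # change which window wins, but often not the pooled output.
        trimmed = responses[:len(responses) // pool * pool]
        return trimmed.reshape(-1, pool).max(axis=1)

    signal = np.array([0.1, 0.9, 0.8, 0.1, 0.0, 0.7])
    w = np.array([0.5, 0.5])          # the shared filter weights
    print(conv1d_maxpool(signal, w))  # pooled, shift-tolerant responses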

Systematic evaluation of convolution neural network advances on the Imagenet

The paper systematically studies the impact of a range of recent advances in CNN architectures and learning methods on the object categorization (ILSVRC) problem. The evaluation tests the influence of the following choices of the architecture: non-linearity (ReLU, ELU, maxout, compatibility with batch normalization), pooling variants (stochastic, max, average, mixed), network width, classifier d...

Improving Deep Neural Networks with Probabilistic Maxout Units

We present a probabilistic variant of the recently introduced maxout unit. The success of deep neural networks utilizing maxout can partly be attributed to favorable performance under dropout, when compared to rectified linear units. It however also depends on the fact that each maxout unit performs a pooling operation over a group of linear transformations and is thus partially invariant to ch...
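
For reference, a plain (non-probabilistic) maxout unit is simply the maximum over a group of k linear transformations of its input. A minimal NumPy sketch with illustrative shapes follows; the probabilistic variant this paper proposes replaces the deterministic max and is not shown here.

    import numpy as np

    def maxout(x, W, b):
        # x: (d,), W: (k, m, d), b: (k, m) -> (m,) activations.
        # Each of the m units keeps the max of its k affine pieces.
        z = np.einsum('kmd,d->km', W, x) + b  # k affine responses per unit
        return z.max(axis=0)                  # pool over the k pieces

    rng = np.random.default_rng(0)
    x = rng.standard_normal(4)
    W = rng.standard_normal((2, 3, 4))  # k=2 pieces, m=3 units, d=4 inputs
    b = rng.standard_normal((2, 3))
    print(maxout(x, W, b))              # three piecewise-linear activations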

Predicting the Effective Factors on Musculoskeletal Disorders among Kerman University of Medical Sciences Computer Users through Neural Network Algorithm in 2018

Introduction: In the past 20 years, the use of computers and computer workstations has increased in both offices and homes, which has consequently led to savings in time, energy, and resources. This study aimed to weight the risk factors of musculoskeletal disorders among computer users using a neural network. Methods: A cross-sectional study was carried out at 200 stations in Kerman University of Medical Sciences...

Question Answering System using Dynamic Coattention Networks

We tackle the difficult problem of building a question answering system by building an end-to-end recurrent neural network with a sequence-to-sequence model. We use the coattention encoder and explore three different decoders: linear, single-layer maxout, and highway maxout network. We train and evaluate our model on the recently published Stanford Question Answering Dataset (SQuAD). Our ...


Journal:
  • CoRR

Volume: abs/1511.02583

Publication year: 2015